1) when the antennas of 2 GPS units are separated by more than 3 ft, you can get heading information.
2) calibrate your compass once per robot, and recalibrate whenever something on the robot changes (wiring, components, ...)
3) keep all wires parallel and twist wire-pairs wherever possible.
4) once a gyrocompass is up and running, it is not affected by magnetic disturbances or nearby metal, unlike a magnetic compass.
5) a dip needle is used near the North Pole to find Magnetic North, where an ordinary compass becomes unreliable.
6) The Variation is the angle between True North and Magnetic North. For California, this angle is currently 13.4 degrees.
7) The heading information produced by a GPS is derived from successive position fixes, so it is inaccurate at low speeds. It becomes more accurate when the GPS unit travels faster than 1 mph, which is already reasonably fast for a robot. On robots, it is suggested to add a compass sensor. Interesting anecdote: in the DARPA Grand Challenge, contestants that relied on the GPS compass heading drove into the K-rail at the start because of the initial incorrect heading.
8) beware that the parameters are (y, x) and not (x, y).
9) a GPS can guide a robot to within 10 ft of a target. After that, vision is the most common technique to get it on target, e.g. to find the orange cone in a RoboMagellan contest.
10) for mobile robotics, it is recommended to use odometry for steering, but regularly re-align with GPS for location correction and with a compass for heading correction. A gyroscope can help with heading stabilization.
11) VisualGPS is a software utility to help with GPS navigation.
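Tip 6 above implies a small correction step in software: a magnetic compass reads Magnetic North, so you add the local variation (declination) to get a true heading. A minimal sketch, using the 13.4-degree California value from the notes and the usual east-positive convention (the function name is my own):

```python
def true_heading(magnetic_heading_deg, declination_deg=13.4):
    """Convert a magnetic compass heading to a true heading.

    East declination is positive: in California (~13.4 deg E),
    a compass reading of 0 (Magnetic North) is really 13.4 deg true.
    Result is normalized to [0, 360).
    """
    return (magnetic_heading_deg + declination_deg) % 360.0

print(true_heading(0.0))  # 13.4
```

Note the modulo at the end: a magnetic heading of 350 plus 13.4 degrees of declination must wrap past 360 back to a small angle.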
Business
next month's contest: Mini-Sumo
next month's class: Parallax Propeller
December Class: Bruce Weimer - navigation using vision
December contest: Talent Show
There was a short discussion about making the Sumo contest fairer by enforcing the weight and size rules more strictly. This gives everyone a fair chance. However, everyone agreed that strict limits would exclude some people whose robots could otherwise compete. It was agreed to run future Sumo contests with 2 categories: a strict weight/size class and an open class that allows other robots to compete.
Our sister club, the Riverside Robotics Club, organizes a Robot Expo on Saturday November 6th from noon to 4 pm at 16625 Krameria in Riverside, at the Woodcrest Library.
Show & Tell
John Davis gave us an update on his RoboMagellan bot:
1) he switched to a Netbook computer instead of the Beagleboard, because his Devantech I2C compass did not work with his Beagleboard's USB port. This problem is fixed in later models of the Beagleboard, but unfortunately not in his.
2) He calibrates his compass by pointing his robot to the 4 Magnetic directions and pushing a button each time. Very simple. This only has to be done once, until the robot configuration changes in the future.
3) The heading values he receives from his compass come as a single BYTE, giving values from 0 (North) to 255 (just short of a full 360 degrees; 256 wraps back to North). You have to make provisions in your software for the sudden jump from 255 back to 0.
Martin Mason is working on a very similar RoboMagellan architecture:
Martin also showed us his Compass GUI app which he developed in Python:
A remarkable fact: all members currently working on RoboMagellan platforms have chosen USB as their preferred interface between the sensors and the computer/microcontroller.
He also showed a freeware program called Serial Chart. It takes an incoming stream of data from the COM port and visualizes it in real time.
The data that you see in the screenshot is captured from a 5DOF IMU from Sparkfun. It interfaces to the PC via an Arduino board that filters the data using a low-pass filter. The algorithm works something like:
loop
    Read NewValue
    FilterOutput = FilterOutput * FilterTimeConstant + NewValue * (1 - FilterTimeConstant)
    Pause so that the loop time stays reasonably constant
end loop
For better accuracy, you can combine it with a gyroscope and fuse the data with a Kalman filter, but that is already pretty advanced.
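The loop above is a standard first-order (exponential) low-pass filter. A runnable Python sketch of the same algorithm, fed a deterministic step input instead of live accelerometer data so the behavior is easy to see:

```python
def low_pass(samples, time_constant=0.9, initial=0.0):
    """Exponential low-pass filter matching the Arduino loop above:
    output = output * k + new * (1 - k).
    A larger time constant k gives a smoother but slower response."""
    output = initial
    filtered = []
    for new_value in samples:
        output = output * time_constant + new_value * (1.0 - time_constant)
        filtered.append(output)
    return filtered

# Step response: the sensor jumps from 0 to a steady reading of 100.
smooth = low_pass([100.0] * 50)
print(round(smooth[0], 1))   # 10.0 -- first sample only moves 10% of the way
print(round(smooth[-1], 1))  # 99.5 -- converges toward 100 after ~50 samples
```

The same smoothing that suppresses noise also delays the response to real changes; the time constant trades one against the other, which is why the IMU pairs this filter with a gyro for fast heading changes.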
Bruce showed us an interesting add-on to RoboRealm, to recognize objects.
It also has a mode to train a robot to follow a path. The Roborealm NOVA Gate module takes snapshots along the way and later estimates its location when it recognizes those snapshots again. The robot then tries to drive through the virtual gates that were defined during the training phase.
Here is a video Bruce posted where he demonstrates his Leaf robot navigating a hallway using the AVM/Roborealm module.
Here is a video showing Leaf robot navigation using AVM Nova Gate module.
Announcements
Our Next Meeting - Saturday January 10th at 10am
10:00-11:00 leafproject.org AI LISP Class
11:00-12:00 Class - Arduino Libraries by Sergei
12:00-12:20 Break
12:20-12:45 Business Meeting
Robot Get Me a Drink Contest
Show-N-Tell - show off your robots!
Tour of the Long Beach Seabase - by Patricia (tentative)